Amazon S3 Storage Class Analysis is an analytics feature that observes how frequently your data is accessed over time to help you determine when to transition objects to a more cost-effective storage class [citation:1]. Its insights help you decide when to move less frequently accessed objects from S3 Standard to S3 Standard-IA (Infrequent Access), which offers lower storage costs [citation:1][citation:4]. This makes it particularly valuable for data-driven lifecycle policy decisions, optimizing storage costs without requiring manual analysis of access patterns.
Storage class analysis observes access patterns for at least 30 days to gather sufficient information before providing recommendations, though initial data visualizations begin appearing within 24-48 hours after configuration [citation:1]. The analysis continues to run after the initial results and updates as access patterns change over time [citation:3].
When analyzing infrequently accessed objects, storage class analysis examines filtered sets of objects grouped by age since upload to S3. It evaluates several factors, including objects in the STANDARD storage class larger than 128 KB, average total storage per age group, and average bytes transferred out per age group (a measure of data retrieved, not access frequency) [citation:1][citation:3]. Failed GET and PUT requests are not counted in the analysis [citation:1].
Storage class analysis organizes objects into predefined age groups to analyze access patterns [citation:1][citation:3]:
Less than 15 days old
15-29 days old
30-44 days old
45-59 days old
60-74 days old
75-89 days old
90-119 days old
120-149 days old
150-179 days old
180-364 days old
365-729 days old
730 days and older
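The boundaries above can be expressed as a small helper. This is an illustrative sketch only, not part of any AWS API; the `age_group` function and its labels are hypothetical, written to mirror the predefined groups listed above.

```python
# Illustrative only: classify an object's age (in days) into the
# predefined age groups used by S3 Storage Class Analysis.
# The boundaries mirror the list above; the function is hypothetical.
AGE_GROUP_BOUNDS = [15, 30, 45, 60, 75, 90, 120, 150, 180, 365, 730]

def age_group(age_days: int) -> str:
    """Return a label like '15-29 days' or '730+ days' for an object age."""
    if age_days < AGE_GROUP_BOUNDS[0]:
        return "<15 days"
    for lower, upper in zip(AGE_GROUP_BOUNDS, AGE_GROUP_BOUNDS[1:]):
        if age_days < upper:
            return f"{lower}-{upper - 1} days"
    return "730+ days"

print(age_group(10))   # <15 days
print(age_group(100))  # 90-119 days
print(age_group(800))  # 730+ days
```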
You can configure storage class analysis to analyze the entire contents of a bucket, or apply filters to group objects for more targeted analysis [citation:1][citation:9]. Filters can be based on common prefixes (simulating folder structures), object tags, or a combination of both [citation:1][citation:9]. You can have up to 1,000 filter configurations per bucket, each providing separate analysis [citation:1].
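A filter configuration combining a prefix and a tag can be built programmatically. The sketch below constructs the `AnalyticsConfiguration` payload in the shape that boto3's `put_bucket_analytics_configuration` expects; the configuration ID, bucket name, prefix, and tag values are hypothetical placeholders, not values from the source.

```python
# Build an analytics configuration that filters by a prefix AND a tag.
# The dict shape follows boto3's put_bucket_analytics_configuration;
# all names and values here are placeholder examples.
def build_analytics_config(config_id: str, prefix: str,
                           tag_key: str, tag_value: str) -> dict:
    return {
        "Id": config_id,
        # "And" combines a prefix with one or more tags; a filter may
        # also be just {"Prefix": ...} or just {"Tag": {...}}.
        "Filter": {
            "And": {
                "Prefix": prefix,
                "Tags": [{"Key": tag_key, "Value": tag_value}],
            }
        },
        "StorageClassAnalysis": {},  # no CSV export configured here
    }

config = build_analytics_config("docs-analysis", "documents/", "team", "finance")
# To apply it (requires AWS credentials and boto3):
# import boto3
# boto3.client("s3").put_bucket_analytics_configuration(
#     Bucket="amzn-s3-demo-bucket", Id=config["Id"],
#     AnalyticsConfiguration=config)
```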
The feature provides storage usage visualizations in the Amazon S3 console that are updated daily [citation:1][citation:10]. You can also export analysis data to a CSV file stored in an S3 bucket, which can be viewed in spreadsheet applications or analyzed with business intelligence tools like Amazon QuickSight [citation:1][citation:9]. The destination bucket must be in the same AWS region as the bucket being analyzed [citation:9].
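The CSV export described above is configured through the `DataExport` element of the same analytics configuration. The sketch below builds that element in the shape boto3 expects; the destination bucket ARN and key prefix are hypothetical, and as noted above, the destination bucket must be in the same Region as the bucket being analyzed.

```python
# Sketch of the StorageClassAnalysis element with a CSV DataExport.
# The destination ARN and prefix are placeholder values.
def build_data_export(dest_bucket_arn: str, prefix: str) -> dict:
    return {
        "DataExport": {
            "OutputSchemaVersion": "V_1",  # schema version of the CSV output
            "Destination": {
                "S3BucketDestination": {
                    "Format": "CSV",
                    "Bucket": dest_bucket_arn,  # must be in the same Region
                    "Prefix": prefix,
                }
            },
        }
    }

export = build_data_export("arn:aws:s3:::amzn-s3-demo-destination-bucket",
                           "analysis-exports/")
```

This dict would be supplied as the `StorageClassAnalysis` value of an analytics configuration rather than the empty `{}` used when no export is wanted.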
Storage class analysis only provides recommendations for transitions from S3 Standard to S3 Standard-IA [citation:1][citation:3]. It does not give recommendations for transitions to S3 One Zone-IA, S3 Glacier, or S3 Glacier Deep Archive storage classes [citation:3][citation:6]. There are costs associated with using storage class analysis; for detailed pricing information, refer to the Management and Insights section of Amazon S3 pricing [citation:1].
The analysis focuses on objects in the STANDARD storage class that are larger than 128 KB [citation:1][citation:3]. Objects smaller than this threshold are not analyzed for transitions [citation:1].
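As a quick illustration of this threshold, the snippet below checks which objects in a listing would even be considered by the analysis. The records mimic the shape of S3 ListObjectsV2 results (`Size` in bytes, `StorageClass`), but the objects themselves are made up.

```python
# Hypothetical object listing in the shape of S3 ListObjectsV2 results.
objects = [
    {"Key": "reports/q1.pdf", "Size": 512_000, "StorageClass": "STANDARD"},
    {"Key": "thumbs/a.jpg",   "Size": 64_000,  "StorageClass": "STANDARD"},    # under 128 KB
    {"Key": "logs/old.gz",    "Size": 900_000, "StorageClass": "STANDARD_IA"}, # already IA
]

THRESHOLD = 128 * 1024  # 128 KB

# Only STANDARD objects larger than the threshold are analyzed.
eligible = [o["Key"] for o in objects
            if o["StorageClass"] == "STANDARD" and o["Size"] > THRESHOLD]
print(eligible)  # ['reports/q1.pdf']
```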